πŸ€ The Hardware Lottery 🎰
by Sarah Hooker, Google Brain [ACM]

- The very first computer hardware was highly specialized: each machine was built to solve one particular problem, such as numerical differentiation or evaluating polynomial models. In the 1960s IBM introduced the concept of an instruction set architecture (with System/360), making migration between hardware generations much easier for software developers. From then until the 2010s we lived in a world of general-purpose hardware: CPUs.

- Computer science ideas win or lose not because one is inherently superior to another, but because some of them lack suitable hardware to be implemented on. Backpropagation, the key algorithm that made the deep learning revolution possible, was invented independently in 1963, 1976, and 1988, and finally applied to CNNs in 1989. Yet it took roughly three more decades before deep neural networks were widely accepted as a promising research direction, and the breakthrough results came on GPUs, which could run massively parallel computations.

- Today the hardware pendulum is swinging back toward domain-specific hardware, reversing the general-purpose trend that began with the CPU.

- Hardware should not remain a limiting factor for breakthrough ideas in AI research. Hardware and software should be co-designed for state-of-the-art algorithms, and algorithm developers need a deeper understanding of the computing platforms they target.
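The backpropagation point above can be made concrete with a minimal sketch (not from the post, and not the paper's code): training a tiny two-layer network shows that both the forward and backward passes reduce to dense matrix multiplications, which is exactly the massively parallel workload GPUs happen to excel at.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((64, 8))         # 64 samples, 8 features (toy data)
y = rng.standard_normal((64, 1))         # toy regression targets

W1 = rng.standard_normal((8, 16)) * 0.1  # layer 1 weights
W2 = rng.standard_normal((16, 1)) * 0.1  # layer 2 weights

lr = 0.1
losses = []
for _ in range(200):
    # Forward pass: two matrix multiplies plus a ReLU.
    h = np.maximum(X @ W1, 0.0)          # hidden activations, shape (64, 16)
    pred = h @ W2                        # predictions, shape (64, 1)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))

    # Backward pass (chain rule): again just matrix multiplies.
    d_pred = 2.0 * err / len(X)          # dLoss/dPred for mean squared error
    dW2 = h.T @ d_pred                   # gradient w.r.t. layer 2 weights
    d_h = (d_pred @ W2.T) * (h > 0)      # gradient through the ReLU
    dW1 = X.T @ d_h                      # gradient w.r.t. layer 1 weights

    # Gradient descent step.
    W1 -= lr * dW1
    W2 -= lr * dW2
```

On a CPU these multiplies run largely serially; on a GPU each output element can be computed in parallel, which is the "lottery ticket" that deep learning drew.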





πŸ€ The Hardware Lottery 🎰
by Sarah Hooker, Google Brain [ACM]

- The very first computer hardware was extremely focused on solving one particular problem - numerical differentiation or polynomial models. In the 1960s IBM invented the concept of Instruction Set and made migration between hardware easier for software developers. Till the 2010s we have been living in the world of general-purpose hardware - CPUs.

- Computer Science Ideas win or lose not because one superior one to another, but because some of them did not have the suitable hardware to be implemented in. Back Propagation Algorithm, the key algorithm that made the deep learning revolution possible, was invented independently in 1963, 1976, 1988 and finally applied to CNN in 1989. However, it was only three decades later that deep neural networks were widely accepted as a promising research direction and the significant result was achieved with GPUs, that could run massive parallel computations.

- Today hardware pendulum is swinging back to domain-specific hardware like it was the CPU invention

- Hardware should not remain a limiting factor for the breakthrough ideas in AI research. Hardware and Software should be codesigned for the SOTA algorithms. Algorithm developers need a deeper understanding of the computer platforms.

read also here

BY PDP-11πŸš€


